

Gradient-EM Bayesian Meta-Learning

Neural Information Processing Systems

Bayesian meta-learning enables robust and fast adaptation to new tasks with uncertainty assessment. The key idea behind Bayesian meta-learning is empirical Bayes inference of a hierarchical model. In this work, we extend this framework to include a variety of existing methods, before proposing our variant based on the gradient-EM algorithm. Our method improves computational efficiency by avoiding back-propagation in the meta-update step, which is computationally expensive for deep neural networks. Furthermore, it provides flexibility in the inner-update optimization procedure by decoupling it from the meta-update. Experiments on sinusoidal regression, few-shot image classification, and policy-based reinforcement learning show that our method not only achieves better accuracy at lower computational cost, but is also more robust to uncertainty.
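To make the decoupling idea concrete, here is a minimal first-order sketch. It is not the paper's GEM-BML algorithm (which operates on the parameters of Gaussian posteriors); it only illustrates the structural point the abstract makes: the inner loop adapts task parameters with any optimizer, and the meta-update moves the meta-parameters toward the adapted ones without back-propagating through the inner loop. All function names and the toy tasks are invented for illustration.

```python
# Hypothetical sketch of a decoupled, backprop-free meta-update
# (Reptile-style surrogate; NOT the exact GEM-BML update, which acts
# on Gaussian posterior parameters rather than point estimates).
import numpy as np

def inner_update(theta, grad_fn, steps=5, lr=0.01):
    """Task adaptation: any optimizer can be plugged in here, because the
    meta-update below never differentiates through these steps."""
    phi = theta.copy()
    for _ in range(steps):
        phi -= lr * grad_fn(phi)
    return phi

def meta_update(theta, adapted_params, meta_lr=0.1):
    """Backprop-free meta-update: move theta toward the mean of the
    task-adapted parameters (a first-order surrogate gradient)."""
    mean_phi = np.mean(adapted_params, axis=0)
    return theta + meta_lr * (mean_phi - theta)

# Toy demo: two quadratic "tasks" with minima at +1 and -1; the
# meta-parameter should drift toward their midpoint, 0.
targets = [1.0, -1.0]
theta = np.array([5.0])
for _ in range(200):
    adapted = [inner_update(theta, lambda p, t=t: 2 * (p - t)) for t in targets]
    theta = meta_update(theta, adapted)
```

Because the outer step only needs the adapted parameters themselves, no second-order terms (and no stored computation graph for the inner loop) are required, which is the source of the computational saving the abstract claims.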


Review for NeurIPS paper: Gradient-EM Bayesian Meta-Learning

Neural Information Processing Systems

Additional Feedback: I think the technical novelty of the proposed algorithm is somewhat limited, since the resulting algorithm is essentially a Bayesian version of Reptile (GEM-BML using the L1 loss). Nevertheless, I like the reinterpretation given in the paper, especially the coordinate-descent view of the meta-update, which decouples the inner-level and outer-level updates. Any argument highlighting the technical novelty of the proposed method would be appreciated. The important aspect of the proposed method is its robustness due to being Bayesian. I think the paper could be strengthened by including more experiments on this aspect, e.g., testing performance or calibration under distributional shift. The performance of GEM-BML and GEM-BML+ is not that impressive for typical few-shot learning settings (the differences from the baselines are not statistically significant in some settings).


Review for NeurIPS paper: Gradient-EM Bayesian Meta-Learning

Neural Information Processing Systems

This paper makes a solid, computationally efficient contribution to the meta-learning area. The reviewers also praised the execution and were largely convinced by the reported results. The authors clarified the concerns raised by the reviewers in their rebuttal. The presentation could be improved, but is acceptable.
